A dynamic gradient approach to Pareto optimization with nonsmooth convex objective functions
Abstract
In a general Hilbert framework, we consider continuous gradient-like dynamical systems for constrained multiobjective optimization involving non-smooth convex objective functions. Based on the Yosida regularization of the subdifferential operators involved in the system, we obtain the existence of strong global trajectories. We prove a descent property for each objective function, and the convergence of trajectories to weak Pareto minima. This approach provides a dynamical endogenous weighting of the objective functions. Applications are given to cooperative games, inverse problems, and numerical multiobjective optimization.
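As background (not a restatement of the system studied in the paper), the Yosida regularization referred to above is the standard construction for a maximally monotone operator A on a Hilbert space H; for A = ∂f with f proper, convex and lower semicontinuous it coincides with the gradient of the Moreau envelope:

\[
A_\lambda = \frac{1}{\lambda}\bigl(I - (I + \lambda A)^{-1}\bigr), \qquad
(\partial f)_\lambda = \nabla f_\lambda, \qquad
f_\lambda(x) = \min_{y \in H}\Bigl\{ f(y) + \tfrac{1}{2\lambda}\,\|x - y\|^{2} \Bigr\}, \qquad \lambda > 0.
\]

Since the Yosida approximation is globally Lipschitz (with constant 1/λ), replacing each subdifferential by it is the standard route to existence of strong global trajectories; the precise regularized system and the passage to the limit are those defined in the paper.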
Related papers
On Sequential Optimality Conditions without Constraint Qualifications for Nonlinear Programming with Nonsmooth Convex Objective Functions
Sequential optimality conditions provide adequate theoretical tools to justify stopping criteria for nonlinear programming solvers. Here, nonsmooth approximate gradient projection and complementary approximate Karush-Kuhn-Tucker conditions are presented. These sequential optimality conditions are satisfied by local minimizers of optimization problems independently of the fulfillment of constrai...
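For orientation only (the exact conditions used in this work may differ), sequential approximate KKT conditions for a convex problem min f(x) s.t. g_i(x) ≤ 0, i = 1, …, m, are typically phrased as the existence of sequences x^k → x* and multipliers λ^k ≥ 0 such that

\[
\operatorname{dist}\Bigl(0,\ \partial f(x^k) + \sum_{i=1}^{m} \lambda_i^{k}\, \partial g_i(x^k)\Bigr) \to 0
\qquad \text{and} \qquad
\min\bigl\{-g_i(x^k),\, \lambda_i^{k}\bigr\} \to 0 \ \ \text{for each } i,
\]

where ∂ denotes the convex subdifferential. As stated above, such conditions hold at local minimizers without any constraint qualification, which is what makes them usable as stopping criteria for solvers.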
Optimality conditions for Pareto efficiency and proper ideal point in set-valued nonsmooth vector optimization using contingent cone
In this paper, we first present a new and important property of the Bouligand tangent cone (contingent cone) of a star-shaped set. We then establish optimality conditions for Pareto minima and proper ideal efficiencies in nonsmooth vector optimization problems by means of the Bouligand tangent cone of the image set, where the objective is a generalized cone-convex set-valued map, in general real normed spaces.
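For reference, the contingent (Bouligand) tangent cone used above is the standard object: for a set S in a normed space X and a point x̄ ∈ S,

\[
T(S,\bar{x}) \;=\; \bigl\{\, v \in X \;:\; \exists\, t_n \downarrow 0,\ \exists\, v_n \to v \ \text{such that}\ \bar{x} + t_n v_n \in S \ \text{for all } n \,\bigr\}.
\]

The optimality conditions mentioned above are stated through this cone applied to the image set of the set-valued objective.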
Optimality Conditions and Duality in Nonsmooth Multiobjective Programs
We study nonsmooth multiobjective programming problems involving locally Lipschitz functions and support functions. Two types of Karush-Kuhn-Tucker optimality conditions with support functions are introduced. Sufficient optimality conditions are presented by using generalized convexity and certain regularity conditions. We formulate Wolfe-type dual and Mond-Weir-type dual problems for our nonsmo...
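As a reminder of the classical constructions involved (written here in the smooth, single-objective form; the paper's formulations involve support functions, locally Lipschitz data, and multiple objectives), for the primal problem min f(x) s.t. g_i(x) ≤ 0 the Wolfe dual is

\[
\max_{u,\,y}\ \ f(u) + \sum_{i=1}^{m} y_i\, g_i(u)
\qquad \text{s.t.} \qquad \nabla f(u) + \sum_{i=1}^{m} y_i\, \nabla g_i(u) = 0, \quad y \ge 0,
\]

while the Mond-Weir dual maximizes f(u) alone, subject to the same stationarity condition together with \(\sum_{i} y_i\, g_i(u) \ge 0\) and \(y \ge 0\).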
A derivative-free comirror algorithm for convex optimization
We consider the minimization of a nonsmooth convex function over a compact convex set subject to a nonsmooth convex constraint. We work in the setting of derivative-free optimization (DFO), assuming that the objective and constraint functions are available through a black box that provides function values for a lower-C² representation of the functions. Our approach is based on a DFO adaptation of...
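For orientation (this is a generic description of comirror-type schemes, not the algorithm of the cited paper), such methods for min f(x) over x ∈ X with g(x) ≤ 0 proceed by switching subgradient steps: at iteration k one checks the constraint value and moves along an objective or a constraint subgradient accordingly,

\[
s_k \in
\begin{cases}
\partial f(x_k), & \text{if } g(x_k) \le \varepsilon_k,\\
\partial g(x_k), & \text{otherwise,}
\end{cases}
\qquad x_{k+1} = \text{a mirror/projection step from } x_k \text{ along } -s_k .
\]

In the derivative-free setting described above, subgradients are not available and must be replaced by quantities built only from black-box function values, which is the adaptation developed in the paper.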
Publication date: 2014